Web Survey Bibliography
Title Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected by the Interview Situation?
Author Niebruegge, S.
Year 2017
Access date 15.09.2017
Abstract RELEVANCE & RESEARCH QUESTION
Screens are everywhere. And so, of course, are interviews. Market research now happens in real life.
The author emphasizes the importance of the interview and its environment for several reasons: It is the core of good research practice. Its costs heavily affect the economic health of research businesses. Because researchers do not see the actual interview environment, they may be unaware of its potential impact on answering behaviour. Panel interviews compete with the multiple distractions that come with ubiquitous devices. We can assume that the interview environment is in constant flux. Last but not least, the emerging shift from interview to observation must be factored into the equation.
METHODS & DATA
A survey with a total of N = 1,049 respondents provides a comprehensive and representative picture of present-day interview environments. Respondents were free to choose the time, place and device. Consistency and commitment to the online interview were measured using a fit statistic from a MaxDiff exercise.
RESULTS
A large share of panel interviews takes place at home. Only 2 % of the interviews can be classified as truly mobile (out-of-home, using a mobile data connection). 88 % of respondents show 100 % consistency in their answering behaviour.
The quality of answering behaviour is largely influenced by non-situational parameters, such as the general personality trait of honesty and truthfulness as measured with the HEXACO-60 personality inventory. It is not, or only to a negligible extent, affected by parameters of the actual interview situation. There are, however, a few remarkable exceptions, such as the consumption of alcohol prior to the interview.
ADDED VALUE
For research designs, it is key to keep in mind the environments in which panel interviews take place. For designs that expand their scope from lab situations to the real world, the very low share of truly mobile interviews is bad news; on the other hand, the results indicate that interview environments are more homogeneous than expected.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (364)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Comparing acquiescent and extreme response styles in face-to-face and web surveys; 2017; Liu, M.; Conrad, F. G.; Lee, S.
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G.; Schober, M. F.; Antoun, C.; Yan, H. Y.; Hupp, A.; Johnston, M.; Ehlen, P.; Vickers, L...
- Effects of Mobile versus PC Web on Survey Response Quality: a Crossover Experiment in a Probability...; 2017; Antoun, C.; Couper, M. P.; Conrad, F. G.
- Methods for Evaluating Respondent Attrition in Web-Based Surveys; 2016; Hochheimer, C. J.; Sabo, R. T.; Krist, A. H.; Day, T.; Cyrus, J.; Woolf, S. H.
- Mobile-only web survey respondents; 2016; Lugtig, P. J.; Toepoel, V.; Amin, A.
- Using official surveys to reduce bias of estimates from nonrandom samples collected by web surveys; 2016; Beresovsky, V.; Dorfman, A.; Rumcheva, P.
- Making use of Internet interactivity to propose a dynamic presentation of web questionnaires; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- Helping respondents provide good answers in Web surveys; 2016; Couper, M. P.; Zhang, C.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T., Wilson, S.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Research gamification for quality pharmaceutical stakeholder insights; 2016; Mondry, B.; Fink, L.
- SurveyTester from Knowledge Navigators ; 2016; Macer, T.
- Simplifying your mobile solution; 2016; Berry, K.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Why Do Web Surveys Take Longer on Smartphones?; 2016; Couper, M. P.; Peterson, G. J.
- Usability Testing within Agile Process; 2016; Holland, T.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Thinking Inside the Box Visual Design of the Response Box Affects Creative Divergent Thinking in an...; 2016; Mohr, A. H.; Sell, A.; Lindsay, T.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents ; 2016; Ansolabehere, S.; Schaffner, B. F.
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K.; Smyth, J. D.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- Navigation Buttons in Web-Based Surveys: Respondents’ Preferences Revisited in the Laboratory; 2016; Romano Bergstrom, J. C.; Erdman, C.; Lakhe, S.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Creation and Usability Testing of a Web-Based Pre-Scanning Radiology Patient Safety and History Questionnaire...; 2016; Robinson, T. J.; DuVall, S.; Wiggins III, R.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- A Meta-Analysis of Breakoff Rates in Mobile Web Surveys; 2015; Mavletova, A. M.; Couper, M. P.
- Optimizing the Decennial Census for Mobile – A Case Study; 2015; Nichols, E. M.; Hawala, E. O.; Horwitz, R.; Bentley, M.
- Using Video to Reinvigorate the Open Question; 2015; Cape, P.
- Are Sliders Too Slick for Surveys? An Experiment Comparing Slider and Radio Button Scales for Smartphone...; 2015; Aadland, D.; Aalberg, T.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Data Quality Standards in Mixed Mode Surveys; 2015; Bremer, J.; Barbulescu, M.; Bennett, J.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- A Review of Issues in Gamified Surveys; 2015; Keusch, F.; Zhang, Che.